# BNE pre-training
## PlanTL-GOB-ES/roberta-base-bne-capitel-ner-plus
- License: Apache-2.0
- Task: Named entity recognition (sequence labeling)
- Tags: Transformers, multilingual
- Downloads: 1,481 · Likes: 7

A Spanish named entity recognition (NER) model based on the RoBERTa architecture, pre-trained on the BNE (Biblioteca Nacional de España) corpus and fine-tuned on the CAPITEL dataset, with improved performance on lowercased named entities. A usage sketch follows this card.
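The cards in this section describe standard Hugging Face Transformers checkpoints, so a short usage sketch may help. The snippet below is a minimal example, assuming the model is published on the Hugging Face Hub under the ID `PlanTL-GOB-ES/roberta-base-bne-capitel-ner-plus` (inferred from the card above); the same call should work for the other CAPITEL NER models in this list by swapping the model ID.

```python
# Minimal NER example with the transformers pipeline.
# The model ID is inferred from the card above; adjust it if the repo differs.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="PlanTL-GOB-ES/roberta-base-bne-capitel-ner-plus",
    aggregation_strategy="simple",  # merge sub-word tokens into whole entities
)

# The "plus" variant is intended to stay robust on lowercased input.
text = "marta lópez trabaja en el ayuntamiento de madrid."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```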
## PlanTL-GOB-ES/roberta-large-bne-sqac
- License: Apache-2.0
- Task: Question answering
- Tags: Transformers, Spanish
- Downloads: 966 · Likes: 8

A RoBERTa-large model for Spanish extractive question answering, pre-trained on a large corpus from the Spanish National Library (BNE) and fine-tuned on the SQAC question-answering dataset. A usage sketch follows this card.
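For the question-answering card above, a similarly hedged sketch: it assumes the checkpoint is available as `PlanTL-GOB-ES/roberta-large-bne-sqac` (inferred from the card) and uses the standard extractive QA pipeline, which returns an answer span taken from the supplied context.

```python
# Minimal extractive QA example with the transformers pipeline.
# The model ID is inferred from the card above; adjust it if the repo differs.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="PlanTL-GOB-ES/roberta-large-bne-sqac",
)

result = qa(
    question="¿Dónde está situada la Biblioteca Nacional de España?",
    context=(
        "La Biblioteca Nacional de España, situada en Madrid, aportó el "
        "corpus de texto con el que se preentrenaron estos modelos."
    ),
)
# Prints the extracted answer span and its confidence score.
print(result["answer"], round(result["score"], 3))
```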
## PlanTL-GOB-ES/roberta-large-bne-capitel-ner
- License: Apache-2.0
- Task: Named entity recognition (sequence labeling)
- Tags: Transformers, multilingual
- Downloads: 370 · Likes: 0

A Spanish named entity recognition model based on the RoBERTa-large architecture, pre-trained on the large-scale BNE corpus and fine-tuned on the CAPITEL-NERC dataset.
## PlanTL-GOB-ES/roberta-base-bne-capitel-ner
- License: Apache-2.0
- Task: Named entity recognition (sequence labeling)
- Tags: Transformers, multilingual
- Downloads: 8,221 · Likes: 3

A Spanish named entity recognition model based on the RoBERTa-base architecture, pre-trained on the BNE corpus and fine-tuned on the CAPITEL-NERC dataset.